Iterative Regularization in Nonparametric Instrumental Regression
Authors
Abstract
We consider the nonparametric regression model with an additive error that is correlated with the explanatory variables. We assume the existence of instrumental variables, which are used in this model to identify and estimate the regression function. Nonparametric estimation by instrumental variables is an ill-posed linear inverse problem with an unknown but estimable operator. We provide a new estimator of the regression function using an iterative regularization method (the Landweber-Fridman method). The optimal number of iterations and the convergence of the mean square error of the resulting estimator are derived under both mild and severe degrees of ill-posedness. A Monte-Carlo exercise shows the impact of some parameters on the estimator and confirms the reasonable finite-sample performance of the new estimator.
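To make the Landweber idea concrete, here is a minimal sketch of Landweber iteration for an ill-posed linear system Ax = b, where the matrix A stands in for a discretized (estimated) operator. The function name, step size, and toy data below are illustrative assumptions, not the paper's exact construction; the point is only that the iteration count plays the role of the regularization parameter.

```python
import numpy as np

def landweber(A, b, n_iter, tau=None):
    """Run n_iter Landweber steps: x <- x + tau * A.T @ (b - A @ x)."""
    if tau is None:
        # Convergence requires 0 < tau < 2 / ||A||^2 (spectral norm).
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + tau * A.T @ (b - A @ x)
    return x

# Toy ill-posed operator: rapidly decaying singular values.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((50, 50)))
V, _ = np.linalg.qr(rng.standard_normal((50, 50)))
s = 0.9 ** np.arange(50)                # severe decay -> ill-posed
A = U @ np.diag(s) @ V.T
x_true = V[:, 0] + 0.5 * V[:, 1]        # lives in well-conditioned directions
b = A @ x_true + 1e-3 * rng.standard_normal(50)

# Stopping after a moderate number of iterations regularizes: the
# well-conditioned components converge quickly, while noise in the
# small-singular-value directions is amplified only slowly.
x_early = landweber(A, b, n_iter=50)
err_early = np.linalg.norm(x_early - x_true)
```

Choosing the iteration count is the whole game here, which is why the abstract's result on the optimal number of iterations matters: too few iterations leave bias, too many amplify the noise.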
Similar resources
Nonparametric instrumental regression with non-convex constraints
This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variab...
METHODS FOR NONPARAMETRIC AND SEMIPARAMETRIC REGRESSIONS WITH ENDOGENEITY: A GENTLE GUIDE
This paper reviews recent advances in estimation and inference for nonparametric and semiparametric models with endogeneity. It first describes methods of sieves and penalization for estimating unknown functions identified via conditional moment restrictions. Examples include nonparametric instrumental variables regression (NPIV), nonparametric quantile IV regression and many more semi-nonparam...
Component Selection and Smoothing in Multivariate Nonparametric Regression
We propose a new method for model selection and model fitting in multivariate nonparametric regression models, in the framework of smoothing spline ANOVA. The “COSSO” is a method of regularization with the penalty functional being the sum of component norms, instead of the squared norm employed in the traditional smoothing spline method. The COSSO provides a unified framework for several recent...
Automatic Smoothing and Variable Selection via Regularization
This thesis focuses on developing computational methods and the general theory of automatic smoothing and variable selection via regularization. Methods of regularization are a commonly used technique for obtaining stable solutions to ill-posed problems such as nonparametric regression and classification. In recent years, methods of regularization have also been successfully introduced to address a cla...
Iterative Regularization for Learning with Convex Loss Functions
We consider the problem of supervised learning with convex loss functions and propose a new form of iterative regularization based on the subgradient method. Unlike other regularization approaches, in iterative regularization no constraint or penalization is considered, and generalization is achieved by (early) stopping an empirical iteration. We consider a nonparametric setting, in the framewo...
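The early-stopping idea in that snippet can be sketched with a subgradient method on a non-smooth convex loss. This is a hedged illustration, not the cited paper's algorithm: the design, step size, and iteration count are arbitrary choices, and mean absolute error stands in for a generic convex loss.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 10
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

def subgradient_early_stop(X, y, n_iter, lr=0.05):
    """Subgradient steps on (1/n) * sum |x_i . w - y_i|, stopped early.

    No penalty term is added; stopping after n_iter iterations is the
    only form of regularization used.
    """
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = X.T @ np.sign(X @ w - y) / len(y)   # a subgradient of the MAE
        w -= lr * g
    return w

w_hat = subgradient_early_stop(X, y, n_iter=500)
err = np.linalg.norm(w_hat - w_true)
```

As with Landweber, the iteration count replaces the penalty weight of Tikhonov-style methods as the tuning parameter.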
Journal title:
Volume, Issue:
Pages: -
Publication date: 2010